  Paid full text   1951 articles
  Free   300 articles
  Free (within China)   329 articles
Electrical engineering   80 articles
Multidisciplinary   156 articles
Chemical industry   55 articles
Metalworking   16 articles
Machinery and instruments   64 articles
Building science   63 articles
Mining engineering   10 articles
Energy and power   30 articles
Light industry   27 articles
Hydraulic engineering   37 articles
Petroleum and natural gas   24 articles
Weapons industry   8 articles
Radio and electronics   218 articles
General industrial technology   136 articles
Metallurgical industry   80 articles
Atomic energy technology   34 articles
Automation technology   1542 articles
  2024   20 articles
  2023   57 articles
  2022   84 articles
  2021   83 articles
  2020   75 articles
  2019   74 articles
  2018   67 articles
  2017   67 articles
  2016   74 articles
  2015   105 articles
  2014   110 articles
  2013   174 articles
  2012   140 articles
  2011   128 articles
  2010   88 articles
  2009   122 articles
  2008   130 articles
  2007   148 articles
  2006   129 articles
  2005   96 articles
  2004   100 articles
  2003   62 articles
  2002   55 articles
  2001   55 articles
  2000   33 articles
  1999   24 articles
  1998   29 articles
  1997   31 articles
  1996   17 articles
  1995   25 articles
  1994   31 articles
  1993   21 articles
  1992   13 articles
  1991   6 articles
  1990   16 articles
  1989   11 articles
  1988   5 articles
  1987   4 articles
  1986   6 articles
  1985   11 articles
  1984   10 articles
  1983   7 articles
  1982   3 articles
  1981   8 articles
  1979   5 articles
  1978   2 articles
  1977   4 articles
  1975   4 articles
  1964   2 articles
  1959   2 articles
2580 search results found.
1.
A Generalised Additive Modelling (GAM) approach is applied to prediction of both particulate and dissolved nutrient concentrations in a wet-tropical river (the Fitzroy River, Queensland, Australia). In addition to covariate terms considered in previous work (i.e. flow, discounted flow and a rising-falling limb term), we considered several new potential covariates: meteorological and hydrological variables that are routinely monitored, available in near-real time, and were considered to have potential predictive power. Of the additional terms considered, only flows from three tributaries of the Fitzroy River (namely, the Nogoa, Comet and Isaac Rivers) were found to significantly improve the model. Inclusion of one or more of these additional flow terms greatly improved results for dissolved nitrogen and dissolved phosphorus concentrations, which were not otherwise amenable to prediction. In particular, the Nogoa sub-catchment, dominated by pasture for cattle, was found to be important in determining dissolved inorganic nitrogen and phosphorus concentrations reaching the river mouth. This insight may direct further research, including future refinement of process-based catchment models. The GAMs described here are used to provide near real-time river boundary conditions for a complex coupled hydrodynamic and biogeochemical model of the Great Barrier Reef Lagoon, and can be coupled with a forecasting hydrological model to allow integrated forecasting simulations of the catchment-to-coast system.
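As a rough illustration of the modelling approach described above, the sketch below fits a GAM with one smooth term each for main-channel flow, a discounted-flow term and the three tributary flows. It assumes the pygam package and invented column names and data; it is not the authors' code.

```python
# Hedged sketch of a GAM with flow covariates, assuming the pygam package and
# hypothetical column names; the data here are random placeholders.
import numpy as np
import pandas as pd
from pygam import LinearGAM, s

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "flow":       rng.lognormal(3, 1, n),     # main-channel flow
    "disc_flow":  rng.lognormal(3, 1, n),     # discounted (antecedent) flow term
    "nogoa_flow": rng.lognormal(2, 1, n),     # tributary flows
    "comet_flow": rng.lognormal(2, 1, n),
    "isaac_flow": rng.lognormal(2, 1, n),
    "din":        rng.lognormal(-1, 0.5, n),  # dissolved inorganic nitrogen
})

X = df[["flow", "disc_flow", "nogoa_flow", "comet_flow", "isaac_flow"]].to_numpy()
y = np.log(df["din"].to_numpy())              # concentrations modelled on a log scale

# One smooth term per covariate; smoothing strength chosen by grid search.
gam = LinearGAM(s(0) + s(1) + s(2) + s(3) + s(4)).gridsearch(X, y)
gam.summary()
```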
2.
Most current semantic parsing methods are based on compositional semantics, and the core of such methods is the lexicon. The lexicon is a collection of lexical entries that define the mapping from words in natural-language sentences to predicates in the knowledge-base ontology, and semantic parsing has long suffered from insufficient lexicon coverage. To address this problem, and building on existing work, this paper proposes a bridging-based lexicon learning method that can automatically introduce and learn new lexical entries during training. To further improve the accuracy of the newly learned entries, we design new word-to-binary-predicate feature templates and use a voting-based method for extracting a core lexicon. Comparative experiments on two public datasets (WebQuestions and Free917) show that the proposed method learns new lexical entries and improves lexicon coverage, thereby improving the performance of the semantic parsing system, especially its recall.
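The coverage problem and the bridging idea can be pictured with a toy sketch: words found in the lexicon map directly to predicates, while uncovered words are "bridged" to candidate binary predicates whose argument types fit the context. The predicates and type schema below are invented; the paper's feature templates and voting-based core lexicon are only hinted at in the comments.

```python
# Toy illustration of lexicon lookup with a bridging fallback; predicate names
# and the type schema are invented, not taken from the paper or a real KB.
LEXICON = {
    "born":    ["people.person.place_of_birth"],
    "married": ["people.person.spouse"],
}

# Binary predicates with (subject type, object type) signatures.
BINARY_PREDICATES = {
    "people.person.place_of_birth": ("person", "location"),
    "people.person.nationality":    ("person", "country"),
    "film.film.directed_by":        ("film", "person"),
}

def candidate_predicates(word, subject_type, object_type):
    """Lexicon hits first; otherwise bridge to every type-compatible predicate."""
    if word in LEXICON:
        return LEXICON[word]
    return [p for p, (dom, rng) in BINARY_PREDICATES.items()
            if (dom, rng) == (subject_type, object_type)]

# "from" is not in the lexicon, so bridging proposes type-compatible predicates;
# in the paper such candidates are then scored with word-predicate features and
# filtered into a core lexicon by voting.
print(candidate_predicates("from", "person", "location"))
```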
3.
Organic devices like organic light emitting diodes (OLEDs) or organic solar cells degrade quickly when exposed to ambient air. Hence, thin films acting as permeation barriers are needed for their protection. Atomic layer deposition (ALD) is known to be one of the best technologies for producing barriers with a low defect density under gentle process conditions. ALD films are also reported to be among the thinnest barrier layers, with a critical thickness (the thickness at which the barrier film becomes continuous) as low as 5–10 nm for ALD-processed Al2O3. In this work, we investigate the barrier performance of Al2O3 films processed by ALD at 80 °C with trimethylaluminum and ozone as precursors. The coverage of defects in such films is investigated on a 5 nm thick Al2O3 film, i.e. below the critical thickness, on calcium using atomic force microscopy (AFM). We find for this sub-critical thickness regime that all spots giving rise to water ingress within the 20 × 20 μm² scan range are positioned on nearly flat surface sites without the presence of particles or large substrate features. Hence, below the critical thickness, ALD leaves open or at least weakly covered spots even on feature-free surface sites. The thickness-dependent performance of these barrier films is investigated for thicknesses ranging from 15 to 100 nm, i.e. above the assumed critical film thickness of this system. To assess barrier performance, electrical calcium corrosion tests are used to measure the water vapor transmission rate (WVTR), electrodeposition is used to decorate and count defects, and dark-spot growth on OLEDs is used to confirm the results for real devices. For 15–25 nm barrier thickness, we observe an exponential decrease in defect density with barrier thickness, which explains the likewise observed exponential decrease in WVTR and OLED degradation rate. Above 25 nm, a further increase in barrier thickness leads to a further exponential decrease in defect density, but only a sub-exponential decrease in WVTR and OLED degradation rate. In conclusion, the performance of the thin Al2O3 permeation barrier is dominated by its defect density, which is reduced exponentially with increasing barrier thickness for alumina thicknesses of up to at least 25 nm.
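The reported scaling can be illustrated with a small numerical sketch: if the defect density falls exponentially with barrier thickness and the WVTR is taken as proportional to it, both quantities drop exponentially. The prefactors and decay length below are invented for illustration and are not fitted to the paper's measurements.

```python
# Illustrative numbers only: hypothetical exponential defect-density decay and a
# WVTR assumed proportional to it (the paper finds this breaks down above ~25 nm).
import numpy as np

def defect_density(thickness_nm, n0=100.0, decay_nm=5.0):
    """Hypothetical defect density (defects per mm^2) versus barrier thickness."""
    return n0 * np.exp(-thickness_nm / decay_nm)

def wvtr(thickness_nm, k=1e-4):
    """Hypothetical WVTR (g m^-2 day^-1), assumed proportional to defect density."""
    return k * defect_density(thickness_nm)

for t in (15, 20, 25, 50, 100):
    print(f"{t:4d} nm: {defect_density(t):10.4f} defects/mm^2, "
          f"WVTR ~ {wvtr(t):.2e} g/m^2/day")
```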
4.
Based on a combination of static dependency analysis and dynamic adjustment, this paper proposes an execution model for logic programs that exploits not only AND-parallelism but also a degree of OR-parallelism, thereby effectively speeding up the execution of logic programs.
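A very rough sketch of the AND-parallel part of such an execution model is given below: subgoals that a (hypothetical) static dependency analysis has marked as independent are solved concurrently, while dependent subgoals run afterwards. The goal representation and the solver are placeholders, not the model proposed in the paper.

```python
# Placeholder sketch of AND-parallel goal evaluation under an assumed static
# dependency analysis; solve() and the goal strings are invented.
from concurrent.futures import ThreadPoolExecutor

def solve(goal):
    """Stand-in for solving one subgoal; returns a (goal -> answer) binding."""
    return {goal: f"answer({goal})"}

def run(independent_goals, dependent_goals):
    bindings = {}
    with ThreadPoolExecutor() as pool:            # AND-parallel: independent goals
        for result in pool.map(solve, independent_goals):
            bindings.update(result)
    for goal in dependent_goals:                  # sequential: goals with dependencies
        bindings.update(solve(goal))
    return bindings

print(run(["p(X)", "q(Y)"], ["r(X,Y)"]))
```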
5.
We present a method for recovering from syntax errors encountered during parsing. The method provides a form of minimum distance repair, has linear time complexity, and is completely automatic. A formal method is presented for evaluating the performance of error recovery methods, based on global minimum-distance error correction. The minimum-distance error recovery method achieves a theoretically best performance on 80% of Pascal programs in the weighted Ripley-Druseikis collection. Comparisons of performance with other error recovery methods are given.
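To make "minimum distance repair" concrete, the toy sketch below scores candidate corrections of an erroneous token stream by token-level edit distance and keeps the closest one. The tokens and candidate set are invented; the paper's method computes such repairs automatically in linear time rather than enumerating candidates.

```python
# Toy illustration of minimum-distance repair: pick the syntactically valid
# candidate closest to the erroneous input under token edit distance.
def token_edit_distance(a, b):
    """Levenshtein distance over token sequences (insert/delete/replace cost 1)."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost)
    return d[m][n]

erroneous = ["if", "x", "then", "y", ":=", "1", "else"]            # dangling 'else'
candidates = [                                                     # assumed repairs
    ["if", "x", "then", "y", ":=", "1"],
    ["if", "x", "then", "y", ":=", "1", "else", "y", ":=", "0"],
]

best = min(candidates, key=lambda c: token_edit_distance(erroneous, c))
print("minimum-distance repair:", " ".join(best))
```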
6.
Intensional negative adjectives (alleged, artificial, fake, false, former, and toy) are unusual adjectives that, depending on context, may or may not be restricting functions. A formal theory of their semantics, pragmatics, and context is presented that uniformly accounts for their complex mathematical and computational characteristics and captures some peculiarities of individual adjectives.
Such adjectives are formalized as new concept builders: negation-like functions that operate on the values of intensional properties of the concepts denoted by their arguments and yield new concepts whose intensional properties have values consistent with the negation of the old values. Understanding these new concepts involves the semantics, pragmatics, and context-dependency of natural language. It is argued that intensional negative adjectives can be viewed as a special-purpose, weaker, context-dependent negation in natural language. The theory explains and predicts many inferences licensed by expressions involving such adjectives. Implementation of sample examples demonstrates its computational feasibility, and the computation of context-dependent interpretation is discussed.
The theory allows one to enhance a knowledge representation system with similar concept-building, negation-like, context-dependent functions, the availability of which appears to be a distinct characteristic of natural languages.
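A toy sketch of the "concept builder" view: an adjective like fake is treated not as an intersecting predicate but as a function that negates selected intensional properties of its argument concept while preserving others. The property names and the selection rule below are invented for illustration.

```python
# Toy concept-builder sketch: "fake" negates function-like intensional properties
# while keeping appearance-like ones; property names and the rule are invented.
from dataclasses import dataclass

@dataclass(frozen=True)
class Concept:
    name: str
    properties: tuple   # ((property_name, value), ...)

def fake(concept):
    """'fake X' denotes a new concept, not the intersection of 'fake' and 'X'."""
    flipped = tuple(
        (prop, (not val) if prop.startswith("can_") else val)
        for prop, val in concept.properties
    )
    return Concept(name=f"fake {concept.name}", properties=flipped)

gun = Concept("gun", (("looks_like_gun", True), ("can_fire", True)))
print(fake(gun))
# Concept(name='fake gun', properties=(('looks_like_gun', True), ('can_fire', False)))
```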
7.
The processing of images obtained from satellites often involves highly repetitive calculations on very large amounts of data. This processing is extremely time consuming when the calculations are performed on sequential machines. Parallel computers are well suited to handling computationally expensive operations such as higher-order interpolations on large data sets. This paper describes work undertaken to develop parallel implementations of a set of resampling procedures on an Alliant VFX/4. Each resampling procedure implemented has been optimised in three stages. First, the algorithm has been restructured so that two-dimensional resampling is performed as two one-dimensional resampling operations. Second, each procedure has been reprogrammed in such a way that the autoparallelisation provided by the FX/Fortran compiler is exploited. Third, data-dependency analysis of each procedure has been performed in order to achieve full optimisation; each procedure has been restructured where appropriate to circumvent data dependencies that inhibit vectorisation and concurrency. The nature and extent of the code optimisation achieved for each procedure is presented in this paper. The original code for the most computationally expensive procedure, as targeted at a sequential machine, was found to have an execution time of 4900 seconds on the Alliant VFX/4 when compiled with regular compiler optimisation options. Following the algorithmic redesign and reprogramming of the code in stages 1 and 2, the execution time was reduced to 248 seconds. Restructuring the code after the data-dependency analysis of stage 3, in order to avoid data dependencies and allow concurrency and vectorisation, further reduced the execution time to 162 seconds. The consequence of this work is that higher-order resampling methods which had not previously been practical are now routinely performed on the Alliant VFX/4 at the University of Dundee.
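The "two one-dimensional passes" restructuring of stage 1 can be sketched as follows: a separable two-dimensional resampling is carried out as a one-dimensional resample along rows followed by one along columns. The cubic kernel and the SciPy routines are chosen only for illustration and are not the original Fortran implementation.

```python
# Sketch of separable 2-D resampling as two 1-D passes (illustrative only;
# the original work used Fortran on the Alliant, not SciPy).
import numpy as np
from scipy.interpolate import interp1d

def resample_1d(img, new_len, axis):
    """Cubic 1-D resampling of a 2-D array along one axis."""
    old = np.linspace(0.0, 1.0, img.shape[axis])
    new = np.linspace(0.0, 1.0, new_len)
    return interp1d(old, img, kind="cubic", axis=axis)(new)

def resample_2d_separable(img, new_rows, new_cols):
    """Two 1-D passes give the same result as one separable 2-D resampling."""
    return resample_1d(resample_1d(img, new_cols, axis=1), new_rows, axis=0)

img = np.random.rand(256, 256)
print(resample_2d_separable(img, 512, 384).shape)   # (512, 384)
```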
8.
Reviews the book, The chemically dependent: Phases of treatment and recovery edited by Barbara C. Wallace (see record 1992-98403-000). While this book is ambitious, interesting, educational, and useful, it is also disappointing, repetitious, and incomplete. Because it tries to accomplish so much, it may appear to have succeeded too little. This book is organized around, and explicative of, several basic ideas which might have been controversial if not heretical had this book been published ten years ago. Section I, purporting to link specific "phases of recovery" to particular forms and functions of treatment, will certainly be useful for novice clinicians but falls short of its overstated goals and is thereby disappointing. Section II is a collection of moderately redundant chapters describing the etiology and treatment of substance abusers from the viewpoints of psychoanalytic, psychodynamic, ego psychology, and object-relations theorists and therapists. Section III focuses on cognitive-behavioral, self-help, and relapse-prevention treatments. Section IV is quite uneven in quality of writing and applicability of content, and could have benefited from closer editorial scrutiny or selectivity. The final section focuses on special needs of particular subpopulations of substance abusers: African-Americans, prison inmates, HIV/AIDS patients, persons who are homeless, those who have been sexually and physically abused, and others. According to the reviewer this is not the best book on substance abuse treatment, but it does present some clinically useful ideas and it is worth reading. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
9.
Evidence from 3 experiments reveals interference effects from structural relationships that are inconsistent with any grammatical parse of the perceived input. Processing disruption was observed when items occurring between a head and a dependent overlapped with either (or both) syntactic or semantic features of the dependent. Effects of syntactic interference occur in the earliest online measures in the region where the retrieval of a long-distance dependent occurs. Semantic interference effects occur in later online measures at the end of the sentence. Both effects endure in offline comprehension measures, suggesting that interfering items participate in incorrect interpretations that resist reanalysis. The data are discussed in terms of a cue-based retrieval account of parsing, which reconciles the fact that the parser must violate the grammar in order for these interference effects to occur. Broader implications of this research indicate a need for a precise specification of the interface between the parsing mechanism and the memory system that supports language comprehension. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
10.
Parallel parsing is currently receiving attention but there is little discussion about the adaptation of sequential error handling techniques to these parallel algorithms. We describe a noncorrecting error handler implemented with a parallel LR substring parser. The parser used is a parallel version of Cormack's LR substring parser. The applicability of noncorrecting error handling for parallel parsing is discussed. The error information provided for a standard set of 118 erroneous Pascal programs is analysed. The programs are run on the sequential LR substring parser.
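The "noncorrecting" idea can be illustrated with a toy recogniser: when an error is detected, the handler records the offending interval and resumes recognition after it, rather than repairing the input. The balanced-parentheses "grammar" and the restart policy below are invented and far simpler than the LR substring parser used in the paper.

```python
# Toy noncorrecting error handling on a balanced-parentheses language: errors
# are reported as intervals and never repaired; recognition restarts afterwards.
def noncorrecting_check(tokens):
    errors, depth, start = [], 0, 0
    for i, tok in enumerate(tokens):
        if tok == "(":
            depth += 1
        elif tok == ")":
            if depth == 0:                    # unmatched ')': report, do not repair
                errors.append((start, i))
                depth, start = 0, i + 1       # restart after the error point
            else:
                depth -= 1
    if depth != 0:                            # unmatched '(' at end of input
        errors.append((start, len(tokens)))
    return errors

print(noncorrecting_check(list("(()))((")))   # [(0, 4), (5, 7)]
```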